- Problem: How to obtain the Greeks (without going to Greece)
Methodology
2.1 Introducing the Greeks
2.2 A tour of Gaussian Processes
Results
3.1 Synthetic data
3.2 European options S&P500
Discussion
1st FMTC-Br, August, 2018.
Holding derivatives exposes you to market risk
One can hedge exposure using appropriate techniques
The most basic technique uses derivative sensitivities (the Greeks)
But how can one compute these?
A collection of quantities used to measure the sensitivity of a derivative's price to changes in market parameters!
Delta (\(\Delta\)): the sensitivity of the option price to changes in the underlying price
Set up a market model:
What if we didn't use a market model at all?
Let's consider a stochastic model for functional data.
\[ \begin{eqnarray}\tag{01} y = f(\mathbf{X}) + \varepsilon, \quad \varepsilon \sim \mathcal{N}(\mathbf{0}, \sigma^2 \mathbf{I}) \end{eqnarray} \]
Assumptions about \(f(\cdot)\) can lead to two types of models (Rasmussen 2004)
Very flexible
Not possible to test all possible sets of functions
Def: A GP is a collection of random variables \(\{f(\mathbf{x})\}_{\mathbf{x} \in \mathbb{R}^D}\) such that for every finite collection \(\mathbf{x}^{(1)}, \ldots, \mathbf{x}^{(N)}\), the vector \((f(\mathbf{x}^{(i)}))_{i=1}^N\) is Gaussian.
Plenty of useful properties
A GP is completely specified by its mean and covariance functions
\[ \begin{eqnarray}\tag{02} m(\mathbf{x}) = \mathbb{E}[f(\mathbf{x})] \end{eqnarray} \]
\[ \begin{eqnarray}\tag{03} k(\mathbf{x},\mathbf{x'}) = \text{cov} (f(\mathbf{x}),f(\mathbf{x'})) \end{eqnarray} \]
Notation: \(f(\mathbf{x}) \sim \mathcal{GP}(m(\mathbf{x}),\ k(\mathbf{x},\mathbf{x'}))\)
\(k\) defines the correlation structure between the inputs!
Toy example: European put prices (Black–Scholes model)
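As a concrete reference for the toy example, put prices and their exact ("true") Deltas under Black–Scholes can be generated as below. This is a minimal sketch; the parameter values `K`, `T`, `r`, `sigma` and the grid of spot prices are illustrative, not the ones used in the experiments.

```python
import numpy as np
from scipy.stats import norm

def bs_put(S, K, T, r, sigma):
    """Black–Scholes price and Delta of a European put."""
    d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    price = K * np.exp(-r * T) * norm.cdf(-d2) - S * norm.cdf(-d1)
    delta = norm.cdf(d1) - 1.0   # put Delta lies in (-1, 0)
    return price, delta

# Illustrative grid of spot prices around an at-the-money strike
S = np.linspace(60.0, 140.0, 9)
price, delta = bs_put(S, K=100.0, T=1.0, r=0.01, sigma=0.2)
```

Pairs `(S, price)` of this kind serve as training data, while the analytic `delta` is kept aside as the ground truth against which the GP estimates are later compared.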
The conditional distribution \(f(\mathbf{x})\,|\,\mathbf{X}, \mathbf{y}\) is given by
\[ \begin{eqnarray} f(\mathbf{x}) \,|\, \mathbf{X}, \mathbf{y} \sim \mathcal{GP}(\tilde{m}(\mathbf{x}), \tilde{k}(\mathbf{x}, \mathbf{x'})), \quad \text{where} \nonumber \end{eqnarray} \]
\[ \begin{eqnarray} \tilde{m}(\mathbf{x}) = k(\mathbf{x}, \mathbf{X})[k(\mathbf{X},\mathbf{X})+\Sigma]^{-1}\mathbf{y}\quad \text{and} \nonumber \end{eqnarray} \]
\[ \begin{eqnarray} \tilde{k}(\mathbf{x},\mathbf{x'}) = k(\mathbf{x},\mathbf{x'}) - k(\mathbf{x},\mathbf{X})[k(\mathbf{X},\mathbf{X})+ \Sigma]^{-1}k(\mathbf{X},\mathbf{x'}). \end{eqnarray} \]
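These posterior formulas can be sketched in a few lines of NumPy, assuming a zero prior mean and homoskedastic noise \(\Sigma = \sigma^2 I\); the function and variable names are our own, not from the original work.

```python
import numpy as np

def gp_posterior(X, y, X_star, kernel, noise_var):
    """Posterior mean and covariance at test points X_star
    for a zero-mean GP with i.i.d. Gaussian observation noise."""
    K = kernel(X, X)                       # k(X, X)
    K_s = kernel(X_star, X)                # k(x, X)
    K_ss = kernel(X_star, X_star)          # k(x, x')
    A = K + noise_var * np.eye(len(X))     # k(X, X) + Sigma
    alpha = np.linalg.solve(A, y)
    mean = K_s @ alpha
    cov = K_ss - K_s @ np.linalg.solve(A, K_s.T)
    return mean, cov

def gauss_kernel(A, B, theta=1.0):
    """Gaussian (squared-exponential) kernel for 1-D inputs."""
    d = A[:, None] - B[None, :]
    return np.exp(-d**2 / (2 * theta**2))

# Smooth toy function standing in for a price curve
X = np.linspace(-3, 3, 20)
y = np.sin(X)
Xs = np.linspace(-3, 3, 50)
mean, cov = gp_posterior(X, y, Xs, gauss_kernel, noise_var=1e-4)
```

For production use one would factor `A` once with a Cholesky decomposition instead of calling `solve` twice, but the direct translation above mirrors the formulas on the slide.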
Defines how the function values at two distinct points are related to each other
If \(k\) is known and \(m(\mathbf{x}) = 0\), the problem reduces to applying the formulas from the previous slide
The choice of the kernel will have impact on the estimates
In our examples \(k\) could be
\[ \begin{eqnarray} k_{G}(|x-x'|) &\equiv \exp \left(-\frac{|x-x'|^2}{2\theta^2} \right) &\text{(Gaussian)}\nonumber \end{eqnarray} \]
\[ \begin{eqnarray} k_{M3}(|x-x'|) & \equiv \left(1 + \frac{\sqrt{3}|x-x'|}{\theta} \right)\exp \left\{-\frac{\sqrt{3}|x-x'|}{\theta} \right\} &\text{(Matérn$_{3/2}$)} \nonumber \end{eqnarray} \]
\[ \begin{eqnarray} k_{M5}(|x-x'|) & \equiv \left(1 + \frac{\sqrt{5}|x-x'|}{\theta} + \frac{5 |x-x'|^2}{3\theta^2} \right)\exp \left\{-\frac{\sqrt{5}|x-x'|}{\theta} \right\} &\text{(Matérn$_{5/2}$)} \nonumber \end{eqnarray} \]
The length-scale \(\theta\) drives the smoothness of the process for a given kernel
Proposition 1.
Given \(f(\mathbf{x})\sim \mathcal{GP}(m(\mathbf{x}),k(\mathbf{x},\mathbf{x'}))\) such that \(\frac{\partial^2 k}{\partial x_i\partial x'_i}(\mathbf{x},\mathbf{x'})\) and \(\frac{\partial m}{\partial x_i}(\mathbf{x})\) exist, then
\[ \begin{eqnarray} \frac{\partial f}{\partial x_i}(\mathbf{x})\sim \mathcal{GP}\left(\frac{\partial m}{\partial x_i}(\mathbf{x}),\frac{\partial^2 k}{\partial x'_i\partial x_i}(\mathbf{x},\mathbf{x'})\right)\nonumber \end{eqnarray} \]
Distribution of the derivative of a GP
Given \(f(\mathbf{x})\sim \mathcal{GP}(m(\mathbf{x}),k(\mathbf{x},\mathbf{x'}))\) such that \(\frac{\partial^2 k}{\partial x_i\partial x'_i}(\mathbf{x},\mathbf{x'})\) and \(\frac{\partial m}{\partial x_i}(\mathbf{x})\) exist, then
\[ \begin{eqnarray} \frac{\partial f}{\partial x_i}(\mathbf{x})\,|\,\mathbf{X},f(\mathbf{X}) \sim \mathcal{GP}&\Big(\frac{\partial k}{\partial x_i}(\mathbf{x},\mathbf{X})k(\mathbf{X},\mathbf{X})^{-1}f(\mathbf{X}),\ \frac{\partial^2 k}{\partial x_i\partial x'_i}(\mathbf{x},\mathbf{x'}) \nonumber\\ &- \frac{\partial k}{\partial x_i}(\mathbf{x},\mathbf{X})k(\mathbf{X},\mathbf{X})^{-1}\frac{\partial k}{\partial x'_i}(\mathbf{X},\mathbf{x'})\Big)\label{eq-2.13} \end{eqnarray} \]
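For the Gaussian kernel the derivative \(\partial k/\partial x\) is available in closed form, so the posterior mean of the derivative in this result can be sketched as below. This is our own illustration, not the paper's code: observations are taken noise-free (as in the proposition) with a small jitter for numerical stability, and a smooth test function with known derivative stands in for an option-price curve.

```python
import numpy as np

def gauss_k(A, B, theta):
    """Gaussian kernel on 1-D inputs."""
    d = A[:, None] - B[None, :]
    return np.exp(-d**2 / (2 * theta**2))

def gauss_dk_dx(A, B, theta):
    """Derivative of the Gaussian kernel w.r.t. its first argument."""
    d = A[:, None] - B[None, :]
    return -d / theta**2 * np.exp(-d**2 / (2 * theta**2))

def gp_derivative_mean(X, f, X_star, theta, jitter=1e-8):
    """Posterior mean of df/dx at X_star given noise-free f(X)."""
    K = gauss_k(X, X, theta) + jitter * np.eye(len(X))
    dK_s = gauss_dk_dx(X_star, X, theta)
    return dK_s @ np.linalg.solve(K, f)

X = np.linspace(-2.0, 2.0, 30)
f = X**2                          # true derivative: 2x
Xs = np.linspace(-1.5, 1.5, 10)
dmean = gp_derivative_mean(X, f, Xs, theta=0.8)
```

Replacing `f` with observed option prices and `X` with the underlying's price grid turns `dmean` into the GP estimate of Delta, with no market model involved.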
Data generated from a BS model
We calculated the 'true' Deltas
Training data vs. test data
Different maturities, strikes, kernels, length-scales, and both in- and out-of-sample test points
Evaluated in terms of rMSE, bias, and coverage
Checked violations in positivity, monotonicity and convexity (pathwise)
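The three evaluation metrics can be sketched as follows (a minimal version, assuming pointwise-normal predictive bands for the coverage computation; the toy numbers are illustrative only):

```python
import numpy as np

def rmse(est, true):
    """Root mean squared error of the estimates."""
    return np.sqrt(np.mean((est - true)**2))

def bias(est, true):
    """Mean signed error of the estimates."""
    return np.mean(est - true)

def coverage(true, mean, sd, z=1.96):
    """Fraction of true values inside the 95% pointwise bands."""
    return np.mean((true >= mean - z * sd) & (true <= mean + z * sd))

# Illustrative toy values
true = np.array([0.0, 1.0, 2.0])
est = np.array([0.1, 0.9, 2.2])
sd = np.full(3, 0.15)
```

Here `true` would be the analytic Deltas from the BS model, and `est`, `sd` the GP posterior mean and standard deviation at the test points.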
Let's see how the GP performs on real data
Data description
Rasmussen, Carl Edward. 2004. “Gaussian Processes in Machine Learning.” In Advanced Lectures on Machine Learning, 63–71. Springer.